Bafta games awards 2025: full list of winners

The Guardian

In a video game year dominated by dark, bloody fantasy adventures – and continued job losses and studio closures – it was a cute robot that stole the night at the 2025 Bafta video game awards. Sony's family-friendly platformer Astro Bot won in five categories at yesterday evening's ceremony, including best game and game design. The rest of the awards were evenly spread across a range of Triple A and independent titles. Oil rig thriller Still Wakes the Deep was the next biggest winner with three awards: new intellectual property, performer in a leading role and performer in a supporting role. Clearly actors looking for Bafta-winning roles need look no further than the North Sea.


Dynamical-generative downscaling of climate model ensembles

Lopez-Gomez, Ignacio, Wan, Zhong Yi, Zepeda-Núñez, Leonardo, Schneider, Tapio, Anderson, John, Sha, Fei

arXiv.org Artificial Intelligence

Regional high-resolution climate projections are crucial for many applications, such as agriculture, hydrology, and natural hazard risk assessment. Dynamical downscaling, the state-of-the-art method to produce localized future climate information, involves running a regional climate model (RCM) driven by an Earth System Model (ESM), but it is too computationally expensive to apply to large climate projection ensembles. We propose a novel approach combining dynamical downscaling with generative artificial intelligence to reduce the cost and improve the uncertainty estimates of downscaled climate projections. In our framework, an RCM dynamically downscales ESM output to an intermediate resolution, followed by a generative diffusion model that further refines the resolution to the target scale. This approach leverages the generalizability of physics-based models and the sampling efficiency of diffusion models, enabling the downscaling of large multi-model ensembles. We evaluate our method against dynamically-downscaled climate projections from the CMIP6 ensemble. Our results demonstrate its ability to provide more accurate uncertainty bounds on future regional climate than alternatives such as dynamical downscaling of smaller ensembles, or traditional empirical statistical downscaling methods. We also show that dynamical-generative downscaling results in significantly lower errors than bias correction and spatial disaggregation (BCSD), and captures more accurately the spectra and multivariate correlations of meteorological fields. These characteristics make the dynamical-generative framework a flexible, accurate, and efficient way to downscale large ensembles of climate projections, currently out of reach for pure dynamical downscaling.
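The two-stage pipeline described above — a physics-based RCM bringing coarse ESM output to an intermediate resolution, then a generative model refining it further and supplying ensemble samples — can be sketched in miniature. This is a toy illustration only: the function names are hypothetical stand-ins, interpolation stands in for the RCM, and additive noise stands in for the diffusion model's sampling.

```python
import random

def rcm_downscale(esm_field):
    """Stage 1 (hypothetical stand-in): a physics-based RCM refines a
    coarse ESM field to an intermediate resolution; here, by linear
    interpolation between neighbouring grid values."""
    out = []
    for a, b in zip(esm_field, esm_field[1:]):
        out.extend([a, (a + b) / 2])
    out.append(esm_field[-1])
    return out

def diffusion_refine(intermediate_field, n_samples=4):
    """Stage 2 (hypothetical stand-in): a generative diffusion model
    draws several plausible high-resolution realisations, mimicked
    here by perturbing the field with noise."""
    samples = []
    for _ in range(n_samples):
        samples.append([v + random.gauss(0.0, 0.1) for v in intermediate_field])
    return samples

# Downscale one coarse ESM projection into an ensemble of realisations.
coarse = [10.0, 12.0, 11.0, 9.0]
ensemble = diffusion_refine(rcm_downscale(coarse))
```

The point of the structure is the division of labour: the expensive, physically consistent stage runs once per ensemble member at reduced cost, while the cheap sampling stage supplies the many realisations needed for uncertainty bounds.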


An uncertainty-aware Digital Shadow for underground multimodal CO2 storage monitoring

Gahlot, Abhinav Prakash, Orozco, Rafael, Yin, Ziyi, Herrmann, Felix J.

arXiv.org Artificial Intelligence

Geological Carbon Storage (GCS) is arguably the only scalable net-negative CO2 emission technology available. While promising, subsurface complexities and the heterogeneity of reservoir properties demand a systematic approach to quantify uncertainty when optimizing production and mitigating storage risks, which include assurances of Containment and Conformance of injected supercritical CO2. As a first step towards the design and implementation of a Digital Twin for monitoring underground storage operations, a machine learning-based data-assimilation framework is introduced and validated on carefully designed realistic numerical simulations. As our implementation is based on Bayesian inference but does not yet support control and decision-making, we coin our approach an uncertainty-aware Digital Shadow. To characterize the posterior distribution for the state of CO2 plumes conditioned on multi-modal time-lapse data, the envisioned Shadow combines techniques from Simulation-Based Inference (SBI) and Ensemble Bayesian Filtering to establish probabilistic baselines and assimilate multi-modal data for GCS problems that are challenged by large degrees of freedom, nonlinear multi-physics, non-Gaussianity, and computationally expensive-to-evaluate fluid flow and seismic simulations. To enable SBI for dynamic systems, a recursive scheme is proposed where the Digital Shadow's neural networks are trained on simulated ensembles for their state and observed data (well and/or seismic). Once training is completed, the system's state is inferred when time-lapse field data becomes available. In this computational study, we observe that a lack of knowledge of the permeability field can be factored into the Digital Shadow's uncertainty quantification. To our knowledge, this work represents the first proof of concept of an uncertainty-aware, in-principle scalable Digital Shadow.


Active Learning for Abrupt Shifts Change-point Detection via Derivative-Aware Gaussian Processes

Zhao, Hao, Pan, Rong

arXiv.org Artificial Intelligence

Change-point detection (CPD) is crucial for identifying abrupt shifts in data, which influence decision-making and efficient resource allocation across various domains. To address the challenges posed by the costly and time-intensive data acquisition in CPD, we introduce the Derivative-Aware Change Detection (DACD) method. It leverages the derivative process of a Gaussian process (GP) for Active Learning (AL), aiming to pinpoint change-point locations effectively. DACD balances the exploitation and exploration of derivative processes through multiple data acquisition functions (AFs). By utilizing GP derivative mean and variance as criteria, DACD sequentially selects the next sampling data point, thus enhancing algorithmic efficiency and ensuring reliable and accurate results. We investigate the effectiveness of DACD method in diverse scenarios and show it outperforms other active learning change-point detection approaches.
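The balance the abstract describes — exploiting where the estimated derivative is large (a likely change-point) while exploring where data is sparse (high uncertainty) — can be illustrated with a deliberately simplified acquisition function. This is not the authors' DACD method: a finite-difference slope stands in for the GP derivative mean and distance-to-nearest-sample stands in for the derivative variance, and all names are illustrative.

```python
def acquisition(x, xs, ys, weight=1.0):
    """Toy derivative-aware acquisition: high where the local slope is
    steep (exploitation) or where samples are far away (exploration)."""
    # Exploitation term: absolute slope between the two samples bracketing x,
    # a crude surrogate for the GP derivative mean.
    pts = sorted(zip(xs, ys))
    slope = 0.0
    for (x0, y0), (x1, y1) in zip(pts, pts[1:]):
        if x0 <= x <= x1 and x1 > x0:
            slope = abs((y1 - y0) / (x1 - x0))
            break
    # Exploration term: distance to the nearest existing sample,
    # a crude surrogate for the GP derivative variance.
    spread = min(abs(x - xi) for xi in xs)
    return slope + weight * spread

def next_sample(candidates, xs, ys):
    """Sequentially pick the candidate maximising the acquisition."""
    return max(candidates, key=lambda x: acquisition(x, xs, ys))

# A step function with a change-point near x = 0.5.
xs = [0.0, 0.4, 0.6, 1.0]
ys = [0.0, 0.0, 1.0, 1.0]
candidates = [0.1, 0.3, 0.5, 0.7, 0.9]
print(next_sample(candidates, xs, ys))  # prints 0.5
```

In a real GP-based version, both terms would come from the closed-form derivative of the posterior, but the sequential select-sample-refit loop has the same shape.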


Uncertainty and Explainable Analysis of Machine Learning Model for Reconstruction of Sonic Slowness Logs

Wang, Hua, Wu, Yuqiong, Zhang, Yushun, Lai, Fuqiang, Feng, Zhou, Xie, Bing, Zhao, Ailin

arXiv.org Artificial Intelligence

Logs are valuable information for oil and gas fields as they help to determine the lithology of the formations surrounding the borehole and the location and reserves of subsurface oil and gas reservoirs. However, important logs are often missing in horizontal or old wells, which poses a challenge in field applications. In this paper, we utilize data from the 2020 machine learning competition of the SPWLA, which aims to predict the missing compressional wave slowness and shear wave slowness logs using other logs in the same borehole. We employ the NGBoost algorithm to construct an ensemble learning model that can predict the results as well as their uncertainty. Furthermore, we combine the SHAP method to investigate the interpretability of the machine learning model. We compare the performance of the NGBoost model with four other commonly used ensemble learning methods: Random Forest, GBDT, XGBoost, and LightGBM. The results show that the NGBoost model performs well on the testing set and can provide a probability distribution for the prediction results. In addition, the variance of the probability distribution of the predicted log can be used to judge the quality of the constructed log. Using the SHAP explainable machine learning method, we calculate the importance of each input log to the predicted results as well as the coupling relationships among input logs. Our findings reveal that the NGBoost model tends to predict greater slowness when the neutron porosity and gamma ray are large, which is consistent with the understanding from petrophysical models. Furthermore, the machine learning model can capture the influence of the changing borehole caliper on slowness, a relationship that is complex and difficult to establish directly. These findings are in line with the physical principles of borehole acoustics.
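The core idea — an ensemble model that returns a distribution, whose variance flags low-quality predictions — can be sketched without the NGBoost library. The toy below uses the spread across a bagged ensemble as an uncertainty proxy rather than NGBoost's natural-gradient boosting of distribution parameters; all function names are illustrative stand-ins.

```python
import random
import statistics

def fit_bagged_means(y, n_models=20, seed=0):
    """Train a trivial 'ensemble': each member is the mean target of a
    bootstrap resample (a stand-in for a real base learner)."""
    rng = random.Random(seed)
    models = []
    for _ in range(n_models):
        idx = [rng.randrange(len(y)) for _ in y]
        models.append(statistics.fmean(y[i] for i in idx))
    return models

def predict_with_uncertainty(models):
    """Report the ensemble mean and standard deviation: a wide spread
    flags a low-confidence prediction, analogous to using the variance
    of NGBoost's predictive distribution to judge log quality."""
    mu = statistics.fmean(models)
    sigma = statistics.stdev(models)
    return mu, sigma

# Toy slowness targets; in the paper these would come from other logs.
y = [100.0, 102.0, 98.0, 101.0, 99.0]
models = fit_bagged_means(y)
mu, sigma = predict_with_uncertainty(models)
```

The practical workflow mirrors the paper's: accept reconstructed log values where the predictive spread is narrow, and flag intervals with a wide spread for review.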


An introduction to distributed training of deep neural networks for segmentation tasks with large seismic datasets

Birnie, Claire, Jarraya, Haithem, Hansteen, Fredrik

arXiv.org Artificial Intelligence

Deep learning applications are drastically progressing in seismic processing and interpretation tasks. However, the majority of approaches subsample data volumes and restrict model sizes to minimise computational requirements. Subsampling the data risks losing vital spatio-temporal information which could aid training, whilst restricting model sizes can impact model performance or, in some extreme cases, render more complicated tasks such as segmentation impossible. This paper illustrates how to tackle the two main issues of training large neural networks: memory limitations and impracticably large training times. Typically, training data is preloaded into memory prior to training, a particular challenge for seismic applications where data is typically four times larger than that used for standard image processing tasks (float32 vs. uint8). Using a microseismic use case, we illustrate how over 750GB of data can be used to train a model by using a data generator approach which only stores in memory the data required for that training batch. Furthermore, efficient training of large models is illustrated through the training of a 7-layer UNet with input data dimensions of 4096 × 4096 (~7.8M parameters). Through a batch-splitting distributed training approach, training times are reduced by a factor of four. The combination of data generators and distributed training removes any necessity for data subsampling or restriction of neural network sizes, opening up the use of larger networks, higher-resolution input data, or a move from 2D to 3D problem spaces.
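The two techniques described — a generator that holds only one batch in memory, and batch-splitting that accumulates gradients over micro-batches before a single update — can be sketched framework-free. This is a minimal illustration with placeholder loading and gradient functions, not the authors' implementation.

```python
def batch_generator(file_paths, batch_size, load_fn):
    """Yield one batch at a time so only batch_size samples are ever
    held in memory, instead of preloading the full (e.g. 750GB) dataset."""
    for start in range(0, len(file_paths), batch_size):
        yield [load_fn(p) for p in file_paths[start:start + batch_size]]

def train_with_batch_splitting(batch, n_splits, grad_fn):
    """Batch-splitting: process a large batch in n_splits micro-batches,
    averaging their gradients before a single weight update."""
    step = max(1, len(batch) // n_splits)
    grads = [grad_fn(batch[i:i + step]) for i in range(0, len(batch), step)]
    return sum(grads) / len(grads)

# Toy usage: 'files' are numbers, 'loading' is identity, 'gradient' is a mean.
paths = list(range(8))
gen = batch_generator(paths, batch_size=4, load_fn=lambda p: float(p))
first_batch = next(gen)                    # only 4 samples in memory
avg_grad = train_with_batch_splitting(
    first_batch, n_splits=2,
    grad_fn=lambda mb: sum(mb) / len(mb))  # mean of micro-batch means
```

In a real setting, `load_fn` would read a seismic volume from disk and `grad_fn` would run a forward/backward pass, with the micro-batch gradients averaged across workers.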


Artificial Intelligence: the key to successful decommissioning in the North Sea?

#artificialintelligence

COVID-19, a low oil price and an industry facing increased environmental scrutiny has resulted in a turbulent 2020 for the oil and gas sector. As many North Sea fields reach maturity, stakeholders will be carefully considering their options including decommissioning and diversifying the energy mix. The National Decommissioning Centre (NDC) (a partnership between the University of Aberdeen, the Oil & Gas Technology Centre (OGTC), and industry) has said that efficient late-life management and decommissioning of assets is a "societal and economic necessity". Emerging tech and artificial intelligence (AI) can help achieve this. However, the contribution AI and new technology could have on decommissioning cannot be considered in isolation.


What Happens When You Mix New Solar Tech And Artificial Intelligence? OilPrice.com

#artificialintelligence

The writing is on the wall. Every major global governmental agency is warning of the imminent tipping point towards catastrophic climate change, even the world's largest oil company Saudi Aramco is now talking about reaching peak oil within the next 20 years, and the International Energy Agency projects that it will happen in more like 10. Solar and wind are cheaper than ever, and large-scale solar mega-projects are quickly becoming the norm. It makes sense, then, that even the supermajor oil companies are diversifying their portfolios and investing in their own demise--also known as the renewable energy sector. Way back in July, 2017 Oilprice reported that France's Total S.A. was "leading the charge on renewables". At the time, Total's website boasted: "For Total, contributing to the development of renewable energies is as much a strategic choice as an industrial responsibility. We are doing our part to diversify the global energy mix by investing in renewables, with a strategic focus on solar energy and bioenergies."


Total Plans to Use Artificial Intelligence to Cut Drilling Costs

#artificialintelligence

Total SA plans to start a digital factory in the coming weeks to tap artificial intelligence in a bid to save hundreds of millions of dollars on exploration and production projects, according to an executive. The use of artificial intelligence to screen geological data will help identify new prospects, and shorten the time to acquire licenses, drill and make discoveries, Arnaud Breuillac, head of E&P, said at a conference organized by IFP Energies Nouvelles in Paris on Friday. It will also help optimize the use of equipment and reduce maintenance costs, he said. The digital factory will employ between 200 and 300 engineers and build on successful North Sea pilot projects, Chief Executive Officer Patrick Pouyanne said at the same event. It will also be a way to attract "young talent" to the industry.
